A blind person’s experiments with tech

L Subramani
The author uses an external keyboard while working on his laptop to prevent any unintentional contact with the touchpad, which may displace the cursor on the screen.
My friend Akhil and I travelled from Chennai to Bengaluru recently on a Vande Bharat train. Sipping the watery tea served on the train, Akhil, who is also blind like me, exclaimed, “This almost feels like travelling by a plane.” He found the layout of the train and placement of the handles blind-friendly. “They’ve installed Braille signs everywhere,” he added.
Hearing the excitement in Akhil’s voice, I decided to find out what my surroundings looked like. I picked up my smartphone, located the SeeingAI app, double tapped on the icon to open it, and clicked on its camera button to take a picture. “People are seated in rows of chairs in a train or flight,” the app described the scene to me in both text and audio format.
I use SeeingAI for everything from reading visiting cards to assessing obstacles on my way and determining if I am standing next to a man or a woman so that I can restrict my hand movement. It is designed for the blind and those who have low vision.
How accurate is SeeingAI? When I trained the camera on one of my colleagues during my Bengaluru trip, it said, “A 29-year-old woman with wavy hair and a happy face”. My colleague laughed and said, “I am 35.”
SeeingAI apart, I don’t need much tech assistance on train journeys. At most, I tune into music, movies, interviews, and cricket news on YouTube to fill the empty hours.
Phone demo
During this visit, I told my sighted colleagues that technology is increasingly becoming a leveller for the visually impaired. One asked me to demonstrate how I use my phone.
Every phone comes with an in-built screen reader. This assistive feature can read out the text on the screen through its own digital larynx called the Text-to-Speech engine. When you turn it on, it describes aloud what you are tapping, selecting and activating — from numbers and alphabets to ‘submit’, ‘next’ and ‘download’ buttons. Blind phone users like me follow these audio cues to swipe and tap till we get the job done.
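The idea can be sketched in a few lines of Python. This is only an illustration of the principle, not any real platform’s API; the element IDs and labels below are invented. The screen reader maps whatever element receives focus to a spoken cue, and an unlabelled element yields nothing useful — which is why app makers labelling their buttons matters so much to blind users.

```python
# A toy model of a screen reader's announcement step.
# Element IDs and labels are invented for illustration only.

UI_ELEMENTS = {
    "btn_submit": "Submit, button",
    "btn_next": "Next, button",
    "field_amount": "Amount, edit field",
}

def announce(element_id: str) -> str:
    """Return the audio cue a screen reader would speak for a focused element."""
    return UI_ELEMENTS.get(element_id, "unlabelled element")

# Tapping 'Submit' produces the cue a blind user listens for:
print(announce("btn_submit"))    # Submit, button
# An icon with no accessibility label gives the user nothing to go on:
print(announce("mystery_icon"))  # unlabelled element
```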
So if I need something to eat in the middle of my work hour, I double tap the Swiggy or Zomato icons, listen to the restaurants that show up for the dish I search for, and pay via UPI apps. When this colleague of mine saw me hailing an auto from the Namma Yatri app, she said I could add a tip to incentivise drivers to accept a ride. I was delighted to know this. The greatest preoccupation of the blind community is trying to figure out how we can access products and services used by all, and not exclusively made for us.
Dialogue box disaster
Life was different when I went fully blind in 1991, aged 18. The cause was retinitis pigmentosa, a condition in which the retinal cells decay, leading to gradual vision loss. People doubted my ability to go to college. I needed human assistance to read the voluminous books prescribed for my literature and journalism course, and to write my exams.
Questions were raised about my ability to join a fast-paced field like news media. Thanks to muscle memory, I could use a typewriter or a keyboard by myself. On autopilot, I could power up the desktop or press the keys necessary to open the word processor. But some tasks I could not crack. On several occasions, sighted members of my family told me the document open in front of me was empty despite my typing non-stop, because a ‘Save as’ dialogue box had silently come in the way of my work.
As an aspiring journalist, I could record interviews. I could glean visual information about an event from my interviewees. I could compose the lead paragraph of a news story mentally. But developing the story further and editing it were a huge challenge. I needed sighted help to check and correct the articles I had anxiously composed.
Guiding voice
My life changed in 2003 after a chance meeting with Srinivasu C, an employee of the National Association for the Blind (Karnataka) in Bengaluru. “Try the demo version of this screen reader software,” he told me, handing me a compact disc loaded with Job Access With Speech, or JAWS.
From the time I inserted the disc into my computer, it began speaking. The voice guided me through the installation process, how to open a web browser, and google a few terms. I sent an email, and composed my articles without help for the first time. It felt as if someone had turned on a light bulb inside my head and I could suddenly see things. I forgot my blindness every time I worked on my system.
The voice came in British and American accents. I chose the former because I have grown up listening to the BBC. I could also customise the speed of the voice. I chose normal speed as that made me feel like I was interacting with a human being. Thanks to voice assistance, now I could track word count, fix spellings, and consult online dictionaries. I was ready for the newsroom.
That was how I bagged a job as a reporter-cum-subeditor at Deccan Herald’s Bengaluru office 20 years ago. My brother Prakash gifted me a full version of JAWS for Windows to mark the big day.
Smart move
I thought screen readers on the computer were the summit of the assistive device revolution, but then came smartphones in the 2010s. The challenge, at least in the early days, was the same as on desktops and laptops: one needed to install third-party screen readers to access the phones. But now, iPhones come with an in-house screen reader called VoiceOver, and Android devices have TalkBack by default. Both work with 99.99% efficiency. Other phone companies are also rolling out their own brands of screen readers. Microsoft’s Narrator comes inbuilt in PCs and laptops.
The first time I used a Nokia phone with voice software was in 2010. I hailed an auto to reach my office and turned on Loadstone GPS on my phone. It was a voice-based navigation app, designed exclusively for the blind. It described to me the route the driver was taking. “You should’ve gone by Primrose Road, but you’re going via Trinity Circle,” I told the driver. “How do you know?” the stunned driver asked.
Now, I could also stay up to date with football scores on Google News, and read the online version of the paper I work for.
The transition to touchscreen from button phones was smooth. My brother offered to buy me an iPhone 3GS if I demonstrated my ability to use it well. I practised on his iPhone, running my fingers on the screen as the voice commanded. I gathered the phone’s general layout, and my swipes and taps got more precise. “Here you go,” said my brother, handing me the 3GS phone in 10 days!
My earphones are always on me so as not to cause noise pollution. But I use only one of them. I keep one ear open so I can respond to someone calling out my name or saying a quick ‘hi’ even as the screen reader chirps instructions into my other ear like a parrot. Multitasking came with practice.
Financial freedom
I shudder thinking about the pre-UPI years, when I had to take a sighted person to an ATM to withdraw money. Then there was the anxiety of losing the cash while taking out a handkerchief from the same pocket. The time I felt truly empowered was in 2015, when I took my first Uber ride and paid the driver via the Paytm wallet. Now drivers could not take extra money from me saying, “No change, sir”. My financial freedom was briefly interrupted when the wallet maker forgot to integrate accessibility in an updated version released later. Times have changed. There are more UPI apps now and they are handy. Earlier, I could recognise the ‘submit’ button on a UPI app only because I was told it was to the right of ‘0’ on the payment page. Now almost everything is labelled for the screen reader to read out.
New-age tech
My friend Akhil is an accessibility expert. He introduces me to exciting new tech that’s free to use. I use Siri, Apple’s AI-powered virtual assistant, extensively. ‘Hey Siri’ — I summon it to send text messages, calculate big numbers, look up the weather forecast, or play songs. I was familiar with speech recognition tech even before, having used Dictate on MS Word, and other speech-to-text apps on the phone to type using my voice. But Siri has become my bionic arm! I have an Alexa-powered Amazon Echo Dot at home, but it is a stationary virtual assistant unlike Siri, which I can carry on the move.
Akhil also helped me find BeMyEyes, an app that connects a blind person with sighted volunteers around the world over live video calls. I once set out to make potato curry by myself. The vegetable was soggy to the touch but smelt fine. I called a volunteer via BeMyEyes and showed the suspicious veggie through my phone camera. “They look okay. Probably soft because you had kept them in the fridge for a long time,” he said.
Around New Year, when people send pictorial wishes to me, I export them to the SeeingAI app to get their descriptions. I then use Microsoft Copilot, which can convert text to images, to compose my return wishes.
Bookshare.org, a low-cost accessible digital library for people with blindness and print disability (including those with dyslexia and autism), is my refuge. Donald Trump may not be the most liked public figure, but during his first presidential term, his government signed an international treaty to make thousands of books published in America available on this website. Thanks to him, I could finally read ‘Different Seasons’ and ‘The Green Mile’ by Stephen King, ‘The General’s Daughter’ by Nelson DeMille, and the ‘Stephanie Plum’ series by Janet Evanovich.
On the job
“How do you sub (edit) news copies? There are words to delete, add, and substitute, and paras to shuffle,” the same colleague enquired. Yes, editing is more complex than composing and writing out an original article.
After I receive a draft over email, I paste it into a Word document and read it fully multiple times. I open another Word document on the side and transfer the paras I want to edit, one by one and in the right order. Focussed listening is enough to spot oddly spelt words. I detest AutoCorrect. Yes, I take a little longer to edit, but I compensate by taking fewer breaks at work.
“What happens when an email notification pops up or a delivery agent shows up at the door? Do you lose track of what the screen reader was saying?” the colleague asked. A few seconds of distraction don’t matter. But at other times, I do have to go over the draft from the start! Earlier, the touchpad on my laptop was a source of such interruptions. My palms would unintentionally touch it and displace the cursor. Now I have bought an external keyboard to overcome the problem.
Looking up information online is a big part of editing. How do we browse a webpage? Keeping our screen reader on, we press keys such as H, L and B to jump to headings of different sizes, lists, and buttons (like search) respectively. But even if I hit the right sub-head on a Wikipedia page, for instance, I have to read it fully to extract precise information. I find the ‘Ctrl + F’ command unwieldy on webpages that have too many interactive elements like hyperlinks.
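This quick-navigation idea can be sketched in Python. The page contents and the scan logic below are invented for illustration; only the H/B/L key bindings mirror the shortcuts screen readers like JAWS actually use. Pressing a key simply scans forward from the current position to the next element of the matching kind.

```python
# Toy model of screen-reader quick navigation on a webpage.
# The page data is invented; H jumps to the next heading,
# B to the next button, L to the next list.

PAGE = [
    ("heading", "Early life"),
    ("paragraph", "Born in..."),
    ("button", "Search"),
    ("heading", "Career"),
    ("list", "References"),
]

KEY_TO_ROLE = {"h": "heading", "b": "button", "l": "list"}

def next_element(page, position, key):
    """Return the index of the next element matching the quick-nav key, or None."""
    role = KEY_TO_ROLE[key]
    for i in range(position + 1, len(page)):
        if page[i][0] == role:
            return i
    return None

# From the top of the page (position -1), pressing H twice
# skips the body text and lands on the "Career" heading:
i = next_element(PAGE, -1, "h")  # index 0, "Early life"
i = next_element(PAGE, i, "h")   # index 3, "Career"
```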
Late vision loss
As a journalist, I am supposed to get behind stories, and not become the story. But I have put down my experiences because there is a considerable lack of awareness about how assistive devices are transforming the lives of people with blindness. I could be viewed as tech-savvy, but the truth is that a lot of blind persons like me use tech for survival. We see it as an extension of our body or a replacement of a part that’s impaired. I have helped many who have experienced late vision loss to explore technology, because Braille can be difficult to grasp for adults, especially the elderly. Conditions such as retinitis pigmentosa, age-related macular degeneration or diabetic retinopathy can cause vision impairment later in life.
Also, I hope my experiences will inspire the sighted to look at the blind as people who can perform almost any task, as talents you can hire, as friends you can hang out with.
Optimistically speaking, our struggles with technology are not very different.
My screen reader stopped talking one fine afternoon when I was recording an interview over a Zoom call. My creaky old laptop had begun to hang because I had opened one too many windows. It felt like a power cut in my head. I shut down the system and rebooted it. Thankfully, I was able to recover all my files.
MY WISHLIST
* Voice commands for generative AI chatbots like ChatGPT, Google Gemini, Microsoft Copilot, and Meta AI.
* Wearable tech powered by augmented reality to describe surroundings on demand.
* Real-time audio descriptions in movies as a default feature, like subtitles.